Size-Depth Tradeoffs for Boolean Formulae
Authors
Abstract
We present a simplified proof that Brent/Spira restructuring of Boolean formulas can be improved to allow a Boolean formula of size n to be transformed into an equivalent log-depth formula of size O(n^α) for arbitrary α > 1.
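For intuition, the sketch below shows the classical Spira-style rebalancing step that the paper refines. It is a minimal illustration under assumed names (Node, separator, plug, rebalance), not the paper's construction: it uses the identity F == (C AND F[C:=1]) OR ((NOT C) AND F[C:=0]) for a subformula C holding between 1/3 and 2/3 of the leaves, which gives depth O(log n) but with a size exponent well above the arbitrary α > 1 that the paper achieves.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    op: str                       # 'AND', 'OR', 'NOT', 'VAR', 'CONST'
    left: Optional['Node'] = None
    right: Optional['Node'] = None
    name: str = ''                # variable name when op == 'VAR'
    val: bool = False             # constant value when op == 'CONST'

def size(f):
    """Formula size, measured as the number of leaves."""
    if f.op in ('VAR', 'CONST'):
        return 1
    if f.op == 'NOT':
        return size(f.left)
    return size(f.left) + size(f.right)

def separator(f):
    """Descend into the larger child until the current subtree holds at
    most 2/3 of the leaves; it then still holds more than 1/3."""
    n = size(f)
    cur = f
    while size(cur) > 2 * n / 3:
        if cur.op == 'NOT' or size(cur.left) >= size(cur.right):
            cur = cur.left
        else:
            cur = cur.right
    return cur

def plug(f, target, v):
    """Copy of f with the subtree `target` replaced by the constant v."""
    if f is target:
        return Node('CONST', val=v)
    if f.op in ('VAR', 'CONST'):
        return f
    if f.op == 'NOT':
        return Node('NOT', plug(f.left, target, v))
    return Node(f.op, plug(f.left, target, v), plug(f.right, target, v))

def rebalance(f):
    """Apply F == (C AND F[C:=1]) OR ((NOT C) AND F[C:=0]) recursively.
    All three recursive calls are on formulas of size at most about 2n/3,
    so the rebalanced formula has depth O(log n)."""
    if size(f) <= 2:
        return f
    c = separator(f)
    cb = rebalance(c)
    return Node('OR',
                Node('AND', cb, rebalance(plug(f, c, True))),
                Node('AND', Node('NOT', cb), rebalance(plug(f, c, False))))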
Similar references
Size-Depth Tradeoffs for Boolean Formulae
We present a simplified proof that Brent/Spira restructuring of Boolean formulas can be improved to allow a Boolean formula of size n to be transformed into an equivalent log-depth formula of size O(n^α) for arbitrary α > 1.
Area–Time Performances of Some Neural Computations
The paper aims to show that VLSI-efficient implementations of Boolean functions (BFs) using threshold gates (TGs) are possible. First we detail depth-size tradeoffs for COMPARISON when implemented by TGs of variable fan-in (Δ); a class of polynomially bounded TG circuits having O(lg n / lg Δ) depth and O(n / Δ) size for any 3 ≤ Δ ≤ c·lg n improves on the previously known size O(n). We then procee...
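As a toy illustration of threshold gates (helper names assumed, not the construction from this paper): with exponentially growing weights, a single threshold gate already decides COMPARISON via sign(Σ_i 2^i (x_i − y_i)); the snippet above concerns the harder regime of fan-in-Δ gates with bounded weights.

def threshold_gate(weights, bits, theta):
    # Classic linear threshold gate: output 1 iff the weighted sum
    # reaches the threshold theta.
    return int(sum(w * b for w, b in zip(weights, bits)) >= theta)

def geq(x_bits, y_bits):
    # x_bits, y_bits: equal-length binary, least significant bit first.
    # X >= Y  iff  sum_i 2^i * x_i - sum_i 2^i * y_i >= 0.
    n = len(x_bits)
    weights = [2 ** i for i in range(n)] + [-(2 ** i) for i in range(n)]
    return threshold_gate(weights, x_bits + y_bits, 0)

assert geq([1, 0, 1], [0, 1, 1]) == 0   # 5 >= 6 is false
assert geq([0, 1, 1], [0, 1, 1]) == 1   # 6 >= 6 is true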
Average-case complexity of detecting cliques
The computational problem of testing whether a graph contains a complete subgraph of size k is among the most fundamental problems studied in theoretical computer science. This thesis is concerned with proving lower bounds for k-CLIQUE, as this problem is known. Our results show that, in certain models of computation, solving k-CLIQUE in the average case requires Ω(n^{k/4}) resources (moreover, k/...
Size-depth-alternation tradeoffs for circuits
A Boolean circuit is a directed acyclic graph with some designated input gates of fan-in zero and one designated output gate of fan-out zero, in which all non-input nodes are labeled with OR, AND, or NOT. All OR and AND gates have fan-in two, and all NOT gates have fan-in one. We assume that the gates of a Boolean circuit are arranged in layers; each layer consists of gates whose inputs come only fro...
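A minimal sketch of this layered view (the representation and helper names are assumptions, not from the cited paper): layer 0 holds the inputs, each gate reads only from the previous layer, and alternations count AND/OR switches across layers.

# Each layer is a list of gates; a gate is (op, a, b) with a, b indices
# into the previous layer ('NOT' uses only a).
def eval_circuit(layers, inputs):
    prev = list(inputs)
    for layer in layers:
        cur = []
        for op, a, b in layer:
            if op == 'AND':
                cur.append(prev[a] and prev[b])
            elif op == 'OR':
                cur.append(prev[a] or prev[b])
            else:                       # 'NOT'
                cur.append(not prev[a])
        prev = cur
    return prev[0]                      # single designated output gate

def alternations(layers):
    # Assumes each layer is homogeneous; counts AND/OR switches,
    # skipping NOT layers.
    ops = [layer[0][0] for layer in layers if layer[0][0] != 'NOT']
    return sum(1 for p, q in zip(ops, ops[1:]) if p != q)

# Depth-2 example computing (x0 OR x1) AND (x1 OR x2): one alternation.
layers = [[('OR', 0, 1), ('OR', 1, 2)], [('AND', 0, 1)]]
assert eval_circuit(layers, [True, False, True]) is True
assert alternations(layers) == 1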
Neural Computing with Small Weights
Jehoshua Bruck, IBM Research Division, Almaden Research Center, San Jose, CA 95120-6099. An important issue in neural computation is the dynamic range of weights in the neural networks. Many experimental results on learning indicate that the weights in the networks can grow prohibitively large with the size of the inputs. Here we address this issue by studying the tradeoffs between the depth and t...
Journal: Inf. Process. Lett.
Volume: 49, Issue: -
Pages: -
Published: 1994